Communication-Efficient Federated Learning: A Second Order Newton-Type Method With Analog Over-the-Air Aggregation

Authors

Abstract

Owing to their fast convergence, second-order Newton-type learning methods have recently received attention in the federated learning (FL) setting. However, current solutions are based on communicating the Hessian matrices from the devices to the parameter server at every iteration, incurring a large number of communication rounds and calling for novel communication-efficient methods. In this article, we propose a method that, similarly to its first-order counterpart, requires each device to share only a model-sized vector at each iteration while hiding the gradient and Hessian information. In doing so, the proposed approach is significantly more privacy-preserving. Furthermore, by leveraging the over-the-air aggregation principle, our method inherits privacy guarantees and obtains much higher efficiency gains. In particular, we formulate the problem of computing the inverse Hessian-gradient product as a quadratic problem that is solved in a distributed way. The proposed framework alternates between updating the inverse Hessian-gradient product using a few alternating direction method of multipliers (ADMM) steps, and updating the global model using Newton’s method. Numerical results show that the proposed approach is scalable under noisy channels in different scenarios and across multiple datasets.
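To make the alternating structure concrete, the following is a minimal, self-contained Python/NumPy sketch of how such a scheme can be organized: each device runs a few consensus-ADMM steps on the quadratic problem whose solution is the inverse Hessian-gradient product, the server forms a noisy average of the transmitted vectors as a stand-in for analog over-the-air aggregation, and the global model is then updated with a Newton step. The synthetic quadratic losses, the penalty parameter rho, the number of ADMM steps, and the Gaussian noise model are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch (not the authors' exact algorithm): devices run consensus-ADMM steps
# to approximate the Newton direction d solving (mean_k H_k) d = mean_k g_k, then the
# server applies a Newton-type update to the global model.
import numpy as np

rng = np.random.default_rng(0)
N, dim = 8, 5                                    # number of devices, model dimension

# Synthetic local quadratic losses f_k(w) = 0.5 * w^T A_k w - b_k^T w  (assumption)
A = [np.eye(dim) + 0.1 * rng.standard_normal((dim, dim)) for _ in range(N)]
A = [a @ a.T + np.eye(dim) for a in A]           # symmetric positive-definite Hessians
b = [rng.standard_normal(dim) for _ in range(N)]

def local_grad_hess(w, k):
    return A[k] @ w - b[k], A[k]                 # gradient and Hessian of device k

def admm_newton_direction(w, rho=1.0, steps=5, noise_std=0.01):
    """Approximate d solving (mean_k H_k) d = mean_k g_k via consensus ADMM."""
    d = [np.zeros(dim) for _ in range(N)]        # local copies of the direction
    y = [np.zeros(dim) for _ in range(N)]        # dual variables
    z = np.zeros(dim)                            # global (consensus) direction
    for _ in range(steps):
        for k in range(N):
            g_k, H_k = local_grad_hess(w, k)
            # local quadratic subproblem: (H_k + rho I) d_k = g_k - y_k + rho z
            d[k] = np.linalg.solve(H_k + rho * np.eye(dim), g_k - y[k] + rho * z)
        # devices transmit d_k + y_k/rho simultaneously; the server only observes a
        # noisy average -- a stand-in for analog over-the-air aggregation
        z = np.mean([d[k] + y[k] / rho for k in range(N)], axis=0)
        z += noise_std * rng.standard_normal(dim)
        for k in range(N):
            y[k] = y[k] + rho * (d[k] - z)
    return z

w = np.zeros(dim)
for it in range(10):
    w = w - admm_newton_direction(w)             # Newton-type global model update
    grad = np.mean([local_grad_hess(w, k)[0] for k in range(N)], axis=0)
    print(f"round {it}: ||grad|| = {np.linalg.norm(grad):.2e}")
```

In this sketch only a model-sized vector is exchanged per ADMM step, and the simultaneous noisy averaging mimics how analog over-the-air aggregation would sum the devices' transmissions in a single channel use, without exposing any individual gradient or Hessian.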


Similar articles

Communication-Efficient Distributed Optimization using an Approximate Newton-type Method

We present a novel Newton-type method for distributed optimization, which is particularly well suited for stochastic optimization and learning problems. For quadratic objectives, the method enjoys a linear rate of convergence which provably improves with the data size, requiring an essentially constant number of iterations under reasonable assumptions. We provide theoretical and empirical evide...


Second Order Derivatives, Newton Method, Application to Shape Optimization

We describe a Newton method applied to the evaluation of a critical point of a total energy associated with a shape optimization problem. The key point of these methods is the Hessian of the shape functional. We give an expression of the Hessian as well as its relation with the second-order Eulerian semi-derivative. An application to the electromagnetic shaping of liquid metals process is studi...


Efficient Second Order Online Learning by Sketching

We propose Sketched Online Newton (SON), an online second-order learning algorithm that enjoys substantially improved regret guarantees for ill-conditioned data. SON is an enhanced version of the Online Newton Step which, via sketching techniques, enjoys a linear running time. We further improve the computational complexity to linear in the number of nonzero entries by creating sparse forms of ...


Unsteady Magneto Hydro Dynamic Flow of a Second Order Fluid over an Oscillating Sheet with a Second Order Slip Flow Model

Unsteady slip flow of a second-grade non-Newtonian electrically conducting fluid over an oscillating sheet has been considered and solved numerically. A second-order slip velocity model is used to predict the flow characteristics past the wall. With the assumption of infinite length in the x-direction, the velocity of the fluid can be assumed to be a function of y and t; hence, with a proper variable change pa...


An Efficient Sixth-Order Newton-Type Method for Solving Nonlinear Systems

In this paper, we present a new sixth-order iterative method for solving nonlinear systems and prove a local convergence result. The new method requires solving five linear systems per iteration. An important feature of the new method is that the LU (lower-upper) decomposition of the Jacobian matrix is computed only once in each iteration. The computational efficie...
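The snippet below is not the sixth-order scheme from this paper; it is a classical two-step (third-order) Newton-type iteration that illustrates the same computational idea: factorize the Jacobian once per iteration with an LU decomposition and reuse the factors for every inner linear solve. The test system, starting point, and tolerance are assumed for demonstration.

```python
# Illustrative pattern only: one LU factorization of the Jacobian per iteration,
# reused for each inner correction step (here, a classical two-step Newton method).
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def F(x):
    # Example nonlinear system F(x) = 0 (assumed for demonstration)
    return np.array([x[0]**2 + x[1]**2 - 1.0,
                     x[0] - x[1]**3])

def J(x):
    # Analytic Jacobian of F
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -3.0 * x[1]**2]])

def frozen_jacobian_newton(x0, tol=1e-12, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        lu, piv = lu_factor(J(x))          # factorize the Jacobian once per iteration
        y = x - lu_solve((lu, piv), F(x))  # first Newton-type correction
        x = y - lu_solve((lu, piv), F(y))  # second correction reuses the same factors
        if np.linalg.norm(F(x)) < tol:
            break
    return x

print(frozen_jacobian_newton([0.5, 0.8]))  # converges to a root near (0.564, 0.826)
```

Reusing the factors means each additional correction costs only a pair of triangular solves rather than a fresh O(n^3) factorization, which is the source of the computational-efficiency gains such methods report.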



Journal

Journal title: IEEE Transactions on Green Communications and Networking

Year: 2022

ISSN: 2473-2400

DOI: https://doi.org/10.1109/tgcn.2022.3173420